Crawlers are programs that automatically scan the Internet or other networks, acquiring data as they go. Numerous search programs use crawlers to build a database or index of existing Web sites--for example, the WebCrawler. Crawlers are also called wanderers, robots, and spiders (since they wander the "web").
Different crawlers search for and document different resources. Some catalog document titles; others catalog the URLs found within HTML pages. None, however, examines the actual content of documents. Some also allow their databases to be augmented by direct submission.
Major crawler search programs include:
In many ways, crawlers do for the Web what Archie does for anonymous ftp sites, or Veronica for gopher menus. Yet while these other database-building programs can identify specific sites, they cannot "crawl" from link to link within them.
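As a rough illustration of the link-to-link behavior described above, the sketch below fetches a page, records its URL and title, and then follows the links it finds, up to a small limit. The seed URL, the page limit, and the helper names are assumptions made for the example, and the standard-library parser used here is far simpler than anything a real crawler would rely on.

# A minimal breadth-first crawler sketch (illustrative only; the seed URL,
# page limit, and lack of politeness rules are assumptions of this example).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the <title> text and every href found on one page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def crawl(seed_url, max_pages=10):
    """Fetch pages breadth-first, cataloging (URL, title) pairs and
    following the links found along the way."""
    queue = deque([seed_url])
    seen = {seed_url}
    catalog = []

    while queue and len(catalog) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched

        parser = PageParser()
        parser.feed(html)
        catalog.append((url, parser.title.strip()))

        # Follow links to HTTP(S) pages that have not been visited yet.
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return catalog


if __name__ == "__main__":
    for url, title in crawl("https://example.com"):
        print(title or "(no title)", "-", url)

Real crawlers also respect robots.txt, pace their requests, and store far more than titles, but the queue-of-links loop above is the core "wandering" behavior that distinguishes them from programs such as Archie and Veronica, which build their databases from known site listings rather than by following links.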